Probability Theory with Naive Bayes Application
Author: Daniel Hassler
Naive Bayes Background
Data
Naive Bayes Classification
Hyperparameter Tuning
Results
Improvements
Though we achieve decent accuracy on the MNIST dataset with a MultinomialNB classifier, this model has one big flaw for image classification: it only looks at the discretized values at specific pixel positions in the image. What would happen if I shifted the “0” or “9” to a different section of the image (off-center)? The model would not be able to classify that case effectively.
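As a hedged sketch of this failure mode (using scikit-learn's small 8x8 digits dataset as a stand-in for MNIST, since the exact training setup above isn't reproduced here), we can train a MultinomialNB model on centered digits and then score it on copies of the test images shifted two pixels to the right:

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import MultinomialNB

# Train on the 8x8 sklearn digits (a small stand-in for MNIST).
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = MultinomialNB().fit(X_train, y_train)
acc_centered = clf.score(X_test, y_test)

# Shift each test image two pixels to the right; the per-pixel
# features the model learned no longer line up with the digit.
imgs = X_test.reshape(-1, 8, 8)
shifted = np.roll(imgs, shift=2, axis=2)
shifted[:, :, :2] = 0  # zero out the wrapped-around columns
acc_shifted = clf.score(shifted.reshape(-1, 64), y_test)

print(f"centered: {acc_centered:.3f}, shifted: {acc_shifted:.3f}")
```

The shifted accuracy drops sharply relative to the centered accuracy, because each pixel position is treated as an independent feature with a fixed meaning.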
One way to address this limitation is to use convolutional neural networks (CNNs), deep learning models used in many computer vision and even NLP applications. Their main feature is the idea of a “sliding window” (convolution) that learns more meaningful local representations, which makes the location of the object we’re classifying less important.
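A minimal NumPy sketch of the “sliding window” idea (not the CNN itself, just the convolution operation that underlies it): a single filter slid over an image responds to a pattern wherever it appears, so the peak response simply moves with the pattern instead of disappearing.

```python
import numpy as np

def conv2d(image, kernel):
    """Slide `kernel` over `image` and record the dot product at each position."""
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A small cross-shaped pattern placed at two different locations.
pattern = np.array([[0, 1, 0],
                    [1, 1, 1],
                    [0, 1, 0]], dtype=float)
img_a = np.zeros((8, 8)); img_a[1:4, 1:4] = pattern   # top-left
img_b = np.zeros((8, 8)); img_b[4:7, 4:7] = pattern   # bottom-right

# Using the pattern itself as the filter: the peak response
# tracks the pattern's location in each image.
resp_a = conv2d(img_a, pattern)
resp_b = conv2d(img_b, pattern)
print(np.unravel_index(resp_a.argmax(), resp_a.shape))  # (1, 1)
print(np.unravel_index(resp_b.argmax(), resp_b.shape))  # (4, 4)
```

The maximum response is identical in both cases and appears exactly where the pattern sits, which is the translation property a fixed-pixel model like MultinomialNB lacks.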